-
García-Ayllón Veintimilla, Salvador (Ed.) Historical information about floods is not commonly used in the US to inform land use planning decisions. Rather, the current approach to managing floods is based on static maps derived from computer simulations of the area inundated by floods of specified return intervals. These maps provide some information about flood hazard, but they do not reflect the underlying processes involved in creating a flood disaster, which typically include increased exposure due to building on flood-prone land, nor do they account for the greater hazard resulting from wildfire. We developed and applied an approach to analyze how exposure has evolved in flood hazard zones in Montecito, California, an area devastated by post-fire debris flows in January 2018. By combining historical flood records of the past 200 years, human development records of the past 100 years, and geomorphological understanding of debris flow generation processes, this approach allows us to look at risk as a dynamic process influenced by physical and human factors, instead of a static map. Results show that floods after fires, in particular debris flows and debris-laden floods, are very common in Montecito (15 events in the last 200 years), and that despite policies discouraging development in hazard areas, development in hazard zones has increased substantially since Montecito joined the National Flood Insurance Program in 1979. We also highlight the limitations of using conventional Flood Insurance Rate Maps (FIRMs) to manage land use in alluvial fan areas such as Montecito. The knowledge produced in this project can help Montecito residents better understand how they came to be vulnerable to floods and identify actions they are taking now that might increase or reduce their vulnerability to the next big flood.
This science-history-centric approach to understanding hazard and exposure evolution using geographic information systems (GIS) and historical records is generalizable to other communities seeking to better understand the nature of the hazards they are exposed to and some of the root causes of their vulnerabilities: in other words, both the natural and social processes producing disasters.
-
Abstract The European Spallation Source (ESS) will be the world's brightest neutron source and will open a new intensity frontier in particle physics. The HIBEAM collaboration aims to exploit the unique potential of the ESS with a dedicated ESS instrument for particle physics which offers world-leading capability in a number of areas. The HIBEAM program includes the first search in thirty years for free neutrons converting to antineutrons, as well as searches for sterile neutrons, ultralight axion dark matter, and a nonzero neutron electric charge. This paper outlines the capabilities, design, infrastructure, and scientific potential of the HIBEAM program, including its dedicated beamline, neutron optical system, magnetic shielding and control, and detectors for neutrons and antineutrons. Additionally, we discuss the long-term scientific exploitation of HIBEAM, which may include measurements of the neutron electric dipole moment and precision studies of neutron decays.
-
Electric propulsion devices using xenon as a propellant are a high-efficiency solution for large conventional satellites. If used as an alternative propellant to xenon, iodine's high storage density would enable these devices to require less mass for use in space technologies. The ability to reduce the mass required for electric propulsion devices would not only reduce the cost of space travel but also open up new opportunities for these devices in smaller, more volume-constrained missions. Iodine, however, is a strong oxidizing agent, so to determine whether it is a viable alternative, its erosive properties must be quantified. The objective of this research project was to characterize an iodine plasma source before using it for material exposure and analysis. A double Langmuir probe was used as the method of data acquisition for the plasma conditions. The plasma characterization identified the conditions in the plasma source that will be used to properly quantify the erosive properties of iodine plasma. Preliminary results indicate a maximum electron temperature of four electron volts and a maximum plasma density of eight inverse cubic centimeters.
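As an illustration of the double Langmuir probe analysis described above, the following minimal sketch (with a hypothetical saturation current and synthetic data, not the experiment's measurements) shows how the electron temperature can be recovered from the symmetric double-probe characteristic I(V) = I_sat · tanh(V / 2Te):

```python
import numpy as np

def double_probe_current(V, I_sat, Te_eV):
    # Symmetric double Langmuir probe characteristic:
    # I(V) = I_sat * tanh(V / (2 * Te)), with V in volts and Te in eV.
    return I_sat * np.tanh(V / (2.0 * Te_eV))

def estimate_te(V, I, I_sat):
    # The slope at the origin is dI/dV|_0 = I_sat / (2 * Te),
    # so Te [eV] = I_sat / (2 * slope).
    mask = np.abs(V) < 1.0                      # fit only points near V = 0
    slope = np.polyfit(V[mask], I[mask], 1)[0]
    return I_sat / (2.0 * slope)

# Synthetic sweep with Te = 4 eV (the abstract's reported maximum)
# and a hypothetical I_sat of 1 mA.
V = np.linspace(-30, 30, 601)
I = double_probe_current(V, I_sat=1e-3, Te_eV=4.0)
print(round(estimate_te(V, I, 1e-3), 1))
```

In practice the measured I-V curve is noisy and the full tanh characteristic is fit rather than just the slope at the origin, but the slope method shows the relationship between the curve and Te most directly.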
-
Continuum robots have long held great potential for applications in inspection of remote, hard-to-reach environments. In future environments such as the Deep Space Gateway, remote deployment of robotic solutions will require a high level of autonomy due to communication delays and the unavailability of human crews. In this work, we explore the application of policy optimization methods through Actor-Critic gradient descent in order to optimize a continuum manipulator's search method for an unknown object. We show that we can deploy a continuum robot without prior knowledge of a goal object's location and converge to a policy that finds the goal and can be reused in future deployments. We also show that the method can be quickly extended to multiple degrees of freedom and that we can restrict the policy with virtual and physical obstacles. These two scenarios are highlighted using a simulation environment with 15 and 135 unique states, respectively.
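A minimal tabular actor-critic sketch on a hypothetical 15-state, one-dimensional search space (an illustrative toy, not the paper's simulation environment) shows the policy-optimization idea: the critic learns state values via TD errors, and the actor shifts action preferences in the direction of those errors until the policy reliably reaches the hidden goal.

```python
import numpy as np

rng = np.random.default_rng(0)
N_STATES, GOAL = 15, 11        # hypothetical search space; goal location unknown to agent
ACTIONS = (-1, +1)             # sweep left / sweep right

theta = np.zeros((N_STATES, 2))  # actor: per-state action preferences
V = np.zeros(N_STATES)           # critic: state-value estimates
alpha_a, alpha_c, gamma = 0.1, 0.2, 0.95

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

for episode in range(2000):
    s = 0
    for t in range(100):
        p = softmax(theta[s])
        a = rng.choice(2, p=p)
        s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else -0.01       # reward only on finding the goal
        done = s2 == GOAL
        td = r + (0.0 if done else gamma * V[s2]) - V[s]
        V[s] += alpha_c * td                   # critic update
        grad = -p                              # d log pi(a|s) / d theta[s]
        grad[a] += 1.0
        theta[s] += alpha_a * td * grad        # actor update
        s = s2
        if done:
            break

# The learned greedy policy should move toward the goal from every state left of it.
print(all(np.argmax(theta[s]) == 1 for s in range(GOAL)))
```

The learned policy table (`theta`) can be reused on redeployment, which is the property the abstract emphasizes for autonomous operation under communication delays.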
-
We are storing and querying datasets with the private information of individuals at an unprecedented scale, in settings ranging from IoT devices in smart homes to mining enormous collections of click trails for targeted advertising. Here, the privacy of the people described in these datasets is usually addressed as an afterthought, engineered on top of a DBMS optimized for performance. At best, these systems support security or managing access to sensitive data. This status quo has brought us a plethora of data breaches in the news. In response, governments are stepping in to enact privacy regulations such as the EU's GDPR. We posit that there is an urgent need for trustworthy database systems that offer end-to-end privacy guarantees for their records, with user interfaces that closely resemble those of a relational database. As we shall see, these guarantees inform everything in the database's design, from how we store data to what query results we make available to untrusted clients. In this position paper we first define trustworthy database systems and put their research challenges in the context of relevant tools and techniques from the security community. We then use this backdrop to walk through the "life of a query" in a trustworthy database system. We start with query parsing and follow the query's path as the system plans, optimizes, and executes it. We highlight how we will need to rethink each step to make it efficient, robust, and usable for database clients.
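One concrete ingredient such a system might use to limit what query results reveal to untrusted clients is differential privacy. The following is a minimal sketch (with a hypothetical table and privacy budget, not drawn from the paper) of a Laplace-noised COUNT query, where the noise scale follows from the query's sensitivity:

```python
import numpy as np

def private_count(predicate, rows, epsilon, rng):
    """Answer COUNT(*) WHERE predicate(row) with epsilon-differential privacy.

    A COUNT query has sensitivity 1 (adding or removing one person changes
    the result by at most 1), so Laplace noise of scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in rows if predicate(r))
    return true_count + rng.laplace(0.0, 1.0 / epsilon)

rng = np.random.default_rng(42)
# Hypothetical patient table: (age, diagnosis)
rows = [(34, "flu"), (61, "flu"), (45, "cold"), (29, "flu"), (52, "cold")]
answer = private_count(lambda r: r[1] == "flu", rows, epsilon=1.0, rng=rng)
print(answer)  # near the true count of 3, but randomly perturbed
```

A trustworthy system would enforce this at the query-answer boundary, tracking the cumulative privacy budget spent per client rather than trusting the client-side code.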
-
A key aim of the HighNESS project for the European Spallation Source is to enable cutting-edge particle physics experiments. This volume presents a conceptual design report for the NNBAR experiment. NNBAR would exploit a new cold lower moderator to make the first search in over thirty years for free neutrons converting to antineutrons. The observation of such a baryon-number-violating signature would be of fundamental significance and tackle open questions in modern physics, including the origin of the matter-antimatter asymmetry. This report shows the design of the beamline, supermirror focusing system, magnetic and radiation shielding, and antineutron detector necessary for the experiment. A range of simulation programs are employed to quantify the performance of the experiment and show how background can be suppressed. For a search with full background suppression, a sensitivity improvement of three orders of magnitude is expected, as compared with the previous search. Civil engineering studies for the NNBAR beamline are also shown, as is a costing model for the experiment.
-
Abstract We present a detailed analysis of the radio galaxy PKS 2250−351, a giant of 1.2 Mpc projected size, its host galaxy, and its environment. We use radio data from the Murchison Widefield Array, the upgraded Giant Metrewave Radio Telescope, the Australian Square Kilometre Array Pathfinder, and the Australia Telescope Compact Array to model the jet power and age. Optical and IR data come from the Galaxy And Mass Assembly (GAMA) survey and provide information on the host galaxy and environment. GAMA spectroscopy confirms that PKS 2250−351 lies at z = 0.2115 in the irregular, and likely unrelaxed, cluster Abell 3936. We find its host is a massive, 'red and dead' elliptical galaxy with negligible star formation but with a highly obscured active galactic nucleus dominating the mid-IR emission. Assuming it lies on the local M–σ relation, it has an Eddington accretion rate of λ_Edd ∼ 0.014. We find that the lobe-derived jet power (a time-averaged measure) is an order of magnitude greater than the hotspot-derived jet power (an instantaneous measure). We propose that over the lifetime of the observed radio emission (∼300 Myr), the accretion has switched from an inefficient advection-dominated mode to an efficient thin-disc mode, consistent with the decrease in jet power. We also suggest that the asymmetric radio morphology is due to its environment, with the host of PKS 2250−351 lying to the west of the densest concentration of galaxies in Abell 3936.
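An Eddington ratio like the one quoted above follows from λ_Edd = L_bol / L_Edd, with the standard Eddington luminosity L_Edd ≈ 1.26 × 10^38 (M_BH/M_⊙) erg/s. A minimal worked sketch, using hypothetical illustrative values for the black-hole mass and bolometric luminosity (not the paper's measurements):

```python
# Standard Eddington luminosity coefficient: erg/s per solar mass.
L_EDD_COEFF = 1.26e38

# Hypothetical inputs for illustration only:
M_BH = 1.0e9    # black-hole mass in solar masses, e.g. inferred from the M-sigma relation
L_BOL = 1.8e45  # bolometric luminosity in erg/s, e.g. from mid-IR AGN emission

l_edd = L_EDD_COEFF * M_BH    # Eddington luminosity in erg/s
lambda_edd = L_BOL / l_edd    # dimensionless Eddington ratio
print(f"{lambda_edd:.3f}")
```

Ratios well below ~0.01 are commonly associated with radiatively inefficient (advection-dominated) accretion, which is why the measured value bears on the mode-switch argument in the abstract.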